Stochastic Neural Networks with Monotonic Activation Functions
Authors
Abstract
We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBMs) that are closely linked to Bregman divergences. This family, which we call the exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums and expresses family members through the choice of a smooth monotonic non-linearity for each neuron. Using contrastive divergence together with our Gaussian approximation, we show that Exp-RBMs can learn useful representations using novel stochastic units.
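The core construction can be sketched in a few lines: given a smooth monotonic activation f, the Gaussian (Laplace-style) approximation samples a unit's output from a normal distribution with mean f(η) and variance f'(η). The sketch below is our own illustration under that reading of the abstract, not the paper's reference implementation, and the function names are ours.

```python
import numpy as np

def stochastic_unit(eta, f, f_prime, rng):
    """Gaussian approximation of a stochastic unit built from a smooth
    monotonic activation f: sample y ~ N(f(eta), f'(eta))."""
    mean = f(eta)
    var = f_prime(eta)  # derivative of the activation gives the variance
    return mean + np.sqrt(var) * rng.standard_normal(np.shape(eta))

# Example with the logistic sigmoid (a relaxed version of a binary unit)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
sigmoid_prime = lambda x: sigmoid(x) * (1.0 - sigmoid(x))

rng = np.random.default_rng(0)
samples = stochastic_unit(np.full(10000, 0.5), sigmoid, sigmoid_prime, rng)
```

Any other smooth monotonic choice (softplus, tanh, arcsinh) plugs in the same way, which is what lets a single sampling rule cover the whole family of units.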
Similar resources
Neural networks with periodic and monotonic activation functions: a comparative study in classification problems
This article discusses a number of reasons why the use of non-monotonic functions as activation functions can lead to a marked improvement in the performance of a neural network. Using a wide range of benchmarks we show that a multilayer feed-forward network using sine activation functions (and an appropriate choice of initial parameters) learns much faster than one incorporating sigmoid functi...
Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays
In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays which are represented by the Takagi-Sugeno (T-S) fuzzy models is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
Asymptotic stability for neural networks with mixed time-delays: The discrete-time case
This paper is concerned with the stability analysis problem for a new class of discrete-time recurrent neural networks with mixed time-delays. The mixed time-delays that consist of both the discrete and distributed time-delays are addressed, for the first time, when analyzing the asymptotic stability for discrete-time neural networks. The activation functions are not required to be differentiab...
Taming the Waves: Sine as Activation Function in Deep Neural Networks
Most deep neural networks use non-periodic and monotonic (or at least quasiconvex) activation functions. While sinusoidal activation functions have been successfully used for specific applications, they remain largely ignored and regarded as difficult to train. In this paper we formally characterize why these networks can indeed often be difficult to train even in very simple scenarios, and desc...
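As a concrete illustration of the contrast drawn in these related works, swapping a monotonic sigmoid for a periodic sine in a dense layer is a one-line change. This is a hedged sketch of our own, not code from any of the cited papers:

```python
import numpy as np

def hidden_layer(x, W, b, activation=np.sin):
    """One dense layer; pass np.sin for a periodic activation,
    or a sigmoid for the conventional monotonic choice."""
    return activation(x @ W + b)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

x = np.array([[0.0, np.pi / 2]])
W = np.eye(2)          # identity weights, just to expose the activation
b = np.zeros(2)

periodic = hidden_layer(x, W, b, activation=np.sin)    # -> [[0., 1.]]
monotonic = hidden_layer(x, W, b, activation=sigmoid)  # -> [[0.5, ...]]
```

The training difficulties the cited papers discuss come from the non-monotonicity of sine (many pre-activations map to the same output), not from any extra implementation cost.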
Handwritten Digit Recognition Using Multi-Layer Feedforward Neural Networks with Periodic and Monotonic Activation Functions
The problem of handwritten digit recognition is tackled by multi-layer feedforward neural networks with different types of neuronal activation functions. Three types of activation functions are adopted in the network, namely, the traditional sigmoid function, the sinusoidal function and a periodic function that can be considered as a combination of the first two functions. To speed up the learn...
Publication date: 2016